Appendix
Nevertheless, Query-Key Norm remains promisingly convergent with SDM's requirement of Hamming distances that read from and write to fewer patterns. Figure: for each layer we take the maximum softmax input for a given text prompt for each head; we then take the mean of this maximum across heads and plot it for each text input. GPT2 Small and Large models, aggregated across all text data for all heads and layers.
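The per-head statistic described above can be sketched as follows; `attn_logits` is a hypothetical random stand-in for the pre-softmax attention scores one would extract from a GPT2 forward pass, not real model activations.

```python
import numpy as np

# Hypothetical pre-softmax attention scores for one text prompt,
# shaped (num_layers, num_heads, num_queries, num_keys).
rng = np.random.default_rng(0)
attn_logits = rng.normal(size=(12, 12, 16, 16))

# Maximum softmax input per head: the largest pre-softmax score
# any query assigns to any key within that head.
max_per_head = attn_logits.max(axis=(2, 3))   # shape (num_layers, num_heads)

# Mean of these maxima across heads, giving one value per layer,
# which is what gets plotted for each text input.
mean_max_per_layer = max_per_head.mean(axis=1)  # shape (num_layers,)
print(mean_max_per_layer.shape)  # (12,)
```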
Attention Approximates Sparse Distributed Memory
Trenton Bricken, Cengiz Pehlevan
While Attention has come to be an important mechanism in deep learning, there remains limited intuition for why it works so well. Here, we show that Transformer Attention can be closely related under certain data conditions to Kanerva's Sparse Distributed Memory (SDM), a biologically plausible associative memory model. We confirm that these conditions are satisfied in pre-trained GPT2 Transformer models. We discuss the implications of the Attention-SDM map and provide new computational and biological interpretations of Attention.
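A minimal NumPy sketch of the correspondence the abstract describes, assuming unit-norm queries and keys (one of the data conditions under which the Attention-SDM map is close); all vectors here are random stand-ins, not weights from a real GPT2 model.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 64, 10

# Unit-norm query and keys, plus stored values.
q = rng.normal(size=d)
q /= np.linalg.norm(q)
K = rng.normal(size=(n, d))
K /= np.linalg.norm(K, axis=1, keepdims=True)
V = rng.normal(size=(n, d))
beta = 8.0  # softmax inverse temperature

# Softmax Attention: values weighted by exp(beta * q.k).
w = np.exp(beta * (K @ q))
attn_out = (w / w.sum()) @ V

# For unit vectors, q.k = 1 - ||q - k||^2 / 2, so the same weights are an
# exponentially decaying function of query-key distance -- the analogue of
# SDM weighting stored patterns by the (approximately exponential) overlap
# of Hamming circles around the query and each write address.
w_dist = np.exp(beta * (1.0 - 0.5 * np.linalg.norm(K - q, axis=1) ** 2))
assert np.allclose(w / w.sum(), w_dist / w_dist.sum())
```

The assertion holds exactly for unit-norm vectors; the paper's claim is that trained Transformer queries and keys approximately satisfy such norm conditions, making softmax attention a close analogue of SDM's distance-based read operation.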